VisEvol: Visual Analytics to Support Hyperparameter Search through Evolutionary Optimization

Authors

Abstract

During the training phase of machine learning (ML) models, it is usually necessary to configure several hyperparameters. This process is computationally intensive and requires an extensive search to infer the best hyperparameter set for the given problem. The challenge is exacerbated by the fact that most ML models are complex internally, and training involves trial-and-error processes that could remarkably affect the predictive result. Moreover, each hyperparameter of an ML algorithm is potentially intertwined with the others, and changing it might result in unforeseeable impacts on the remaining hyperparameters. Evolutionary optimization is a promising method to try to address those issues. According to this method, performant models are stored, while the remainder are improved through crossover and mutation processes inspired by genetic algorithms. We present VisEvol, a visual analytics tool that supports interactive exploration of hyperparameters and intervention in this evolutionary procedure. In summary, our proposed tool helps the user to generate new models through evolution and eventually explore powerful hyperparameter combinations in diverse regions of the extensive search space. The outcome is a voting ensemble (with equal rights) that boosts the final predictive performance. The utility and applicability of VisEvol are demonstrated with two use cases and interviews with ML experts who evaluated the effectiveness of the tool.
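The evolutionary procedure the abstract describes can be sketched in a few lines: keep the best-scoring hyperparameter sets, replace the rest through crossover and mutation, and combine the final survivors into an equal-weight voting ensemble. This is a minimal illustration of the general technique, not the VisEvol system itself; the dataset, search space, population size, and number of generations below are all illustrative assumptions.

```python
import random

from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

random.seed(0)

# Hypothetical hyperparameter space for a decision tree.
SPACE = {
    "max_depth": [2, 3, 5, 8, None],
    "min_samples_split": [2, 4, 8, 16],
    "criterion": ["gini", "entropy"],
}

def random_config():
    return {k: random.choice(v) for k, v in SPACE.items()}

def crossover(a, b):
    # Each hyperparameter is inherited from one of the two parents.
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(cfg, rate=0.3):
    # Each hyperparameter is resampled with probability `rate`.
    return {k: random.choice(SPACE[k]) if random.random() < rate else v
            for k, v in cfg.items()}

def fitness(cfg, X, y):
    # Cross-validated accuracy of a model built from this configuration.
    model = DecisionTreeClassifier(random_state=0, **cfg)
    return cross_val_score(model, X, y, cv=3).mean()

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

population = [random_config() for _ in range(8)]
for generation in range(5):
    scored = sorted(population, key=lambda c: fitness(c, X_train, y_train),
                    reverse=True)
    survivors = scored[:4]            # performant configurations are stored
    children = [mutate(crossover(*random.sample(survivors, 2)))
                for _ in range(4)]    # the rest replaced via crossover + mutation
    population = survivors + children

# Majority-vote ensemble with equal rights over the final survivors.
ensemble = VotingClassifier(
    [(f"m{i}", DecisionTreeClassifier(random_state=0, **cfg))
     for i, cfg in enumerate(population[:4])],
    voting="hard")
ensemble.fit(X_train, y_train)
print(ensemble.score(X_test, y_test))
```

In VisEvol the user can inspect and intervene in each of these steps visually; here the selection and variation are fully automatic.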


Similar Articles

Stochastic Hyperparameter Optimization through Hypernetworks

Machine learning models are often tuned by nesting optimization of model weights inside the optimization of hyperparameters. We give a method to collapse this nested optimization into joint stochastic optimization of weights and hyperparameters. Our process trains a neural network to output approximately optimal weights as a function of hyperparameters. We show that our technique converges to l...


Visual Search Analytics: Combining Machine Learning and Interactive Visualization to Support Human-Centred Search

Searching within large online document collections has become a common activity in our modern information-centric society. While simple fact verification tasks are well supported by current search technologies, when the search tasks become more complex, a substantial cognitive burden is placed on the searcher to craft and refine their queries, evaluate and explore among the search results, and ...


Gradient-based Hyperparameter Optimization through Reversible Learning

Tuning hyperparameters of learning algorithms is hard because gradients are usually unavailable. We compute exact gradients of cross-validation performance with respect to all hyperparameters by chaining derivatives backwards through the entire training procedure. These gradients allow us to optimize thousands of hyperparameters, including step-size and momentum schedules, weight initialization...


Hyperparameter Search Space Pruning - A New Component for Sequential Model-Based Hyperparameter Optimization

The optimization of hyperparameters is often done manually or exhaustively, but recent work has shown that automatic methods can optimize hyperparameters faster and even achieve better final performance. Sequential model-based optimization (SMBO) is the current state-of-the-art framework for automatic hyperparameter optimization. Currently, it consists of three components: a surrogate model, an ac...


Learning to Warm-Start Bayesian Hyperparameter Optimization

Hyperparameter optimization undergoes extensive evaluations of validation errors in order to find the best configuration of hyperparameters. Bayesian optimization is now popular for hyperparameter optimization, since it reduces the number of validation error evaluations required. Suppose that we are given a collection of datasets on which hyperparameters are already tuned by either humans with ...
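The warm-starting idea above can be reduced to its core: configurations already tuned on similar datasets are evaluated first, so the search does not start blind. In the sketch below, plain random search stands in for the Bayesian surrogate for brevity, and the objective function and transferred configurations are illustrative assumptions.

```python
import random

random.seed(1)

# Hypothetical objective: validation error as a function of two
# hyperparameters (learning rate and regularization strength).
def validation_error(lr, reg):
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

# Configurations assumed to have been tuned on similar datasets.
# Warm-starting means these are evaluated before any blind sampling.
transferred = [(0.08, 0.02), (0.12, 0.005), (0.5, 0.1)]

# Cold-start candidates drawn at random from the search space.
random_draws = [(random.uniform(0.0, 1.0), random.uniform(0.0, 0.1))
                for _ in range(20)]

best = min(transferred + random_draws,
           key=lambda cfg: validation_error(*cfg))
print(best)
```

With a real Bayesian optimizer, the transferred evaluations would additionally seed the surrogate model, steering later acquisitions toward promising regions rather than merely providing good starting candidates.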



Journal

Journal title: Computer Graphics Forum

Year: 2021

ISSN: 1467-8659, 0167-7055

DOI: https://doi.org/10.1111/cgf.14300